Report - VU innovations in Scholarly Communication

1 Introduction

The VU University Library wants its services to be closely connected to the scholarly communication processes of VU researchers.

The way researchers share and claim their findings (scholarly communication) is changing under the influence of new communication technologies. In the digital era the medium has changed, and it influences the way scholars communicate: new ways to discover the latest scientific information, set up faster analyses designed for reproducibility, write collaboratively, publish articles and reusable data on the internet, do outreach for findings through more than one channel, and have findings assessed in different ways.

This project investigates the digital tools used in scholarly communication within the VU and VUmc. By joining a global survey run from Utrecht University by Kramer and Bosman [1], we can compare our results with those from other countries.

[1] Kramer, B. and J. Bosman, Innovations in scholarly communication - global survey on research tool usage [version 1; referees: awaiting peer review]. F1000Research 2016, 5:692 (doi: 10.12688/f1000research.8414.1)

The results in this report are meant as conversation material for the faculties, to improve library services in such a way that they better support the work of researchers in their scholarly communication activities.

1.1 Interactive annotating

To advance scholarly communication with tools ourselves, we added web annotation to this report using Hypothes.is. As a reader you can make public and private annotations at the URL of this report (see the top-right corner). You can highlight content in this document and save your thoughts for later, or share them with others, right from the browser as you read along. We encourage you to do so. (In case you save the HTML file, be aware that web annotations are bound to the web address; if the address changes, the annotations will not move with it.)

2 Method

To start, we have to realise that the survey design was a given to us. We were not involved from the start, but joined this research project from Utrecht University when it was well underway. The goal of the researchers Kramer and Bosman was to gain insight into which digital online tools are used during the different activities of scholarly communication.

2.1 Research questions

Our goal is to improve library services, so we first asked library employees which answers they would like to get from the survey data, divided into two parts: 1. Tool usage of the VU and VUmc researchers as a single group, with comparisons to other parts of the world. 2. Tool usage of the different disciplines of VU and VUmc, also with comparisons to other parts of the world.

All these questions can be found in this document containing the VU101 research questions. You can also find here the prioritisation of which questions should be treated first, and which others would follow if time was left for further research.

2.2 Data gathering

Next, we needed VU and VUmc researchers to fill in the survey. For this we needed a method to filter these researchers from the data, and we needed to communicate the survey to this specific target group.

For filtering VU and VUmc researchers out of the survey data, we asked the lead researchers to add a hash [7V4u8a] to the survey URL. This custom URL can be used in communicating the survey. To monitor the survey activity in real time ourselves, we created a short URL from the custom URL: [http://bit.ly/vu101innovations]. This URL was used in communication to VU and VUmc researchers.

To communicate this survey to this specific target group within the VU and VUmc, we asked the research portfolio managers to distribute this survey in an e-mail from the faculty secretaries. This resulted in 543 visitors in January, and an extra 296 after a reminder in February, as you can see in the diagram below.

Web activity of the VU101survey: Number of visits; 543 in January, 296 in February.


From a population of 3772 academic staff at VU and VUmc, we had a total of 839 visits (22%) to this short URL. As you can see in the results section, 531 people (14%) finally filled in the survey.

2.3 Data preparation

In April we received the anonymised and cleaned (e.g. whitespace removed) data from the lead researchers: one set with all the data, and one with only the respondents from the VU and VUmc. We prepared the data for plotting with the following procedure:

  • changing columns representing a tool from a string value to a boolean value;
  • setting the boolean in a tool column to true when the name of the tool was written down in the free-text field for that respondent;
  • adding boolean values for respondents affiliated to an institution in an OECD country;
  • marking which column belongs to which research phase in the scholarly communication cycle (Discovery, Analysis, Writing, Publication, Outreach, Assessment), and within these phases which columns represent which research activity (e.g. searching literature, writing collaboratively, sharing data, …);
  • labelling which tools are supported by the VU library.

Extra information from the tool database (e.g. tool age, Twitter followers and open science category) could be taken into account for making additional correlations, but due to time limitations we left this out of the answers for now.
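A minimal sketch of these preparation steps in R; the column names, the example tool, and the abbreviated OECD list are illustrative, not the actual survey export:

```r
# Toy survey extract; the real export has one string column per tool.
survey <- data.frame(
  mendeley = c("Mendeley", "", NA),             # tool column as exported
  others   = c("", "I also use Mendeley", ""),  # free-text field
  country  = c("Netherlands", "Brazil", "Germany"),
  stringsAsFactors = FALSE
)

oecd_countries <- c("Netherlands", "Germany", "United States")  # abbreviated

# 1. tool column: string value -> boolean
survey$mendeley_used <- !is.na(survey$mendeley) & survey$mendeley != ""
# 2. also TRUE when the tool is named in the free-text field
survey$mendeley_used <- survey$mendeley_used |
  grepl("mendeley", tolower(survey$others))
# 3. flag respondents affiliated to an institution in an OECD country
survey$oecd <- survey$country %in% oecd_countries

survey$mendeley_used  # TRUE TRUE FALSE
survey$oecd           # TRUE FALSE TRUE
```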

Data Scaffold: survey dataset enriched with additional data for making plots.

2.4 Data plots, coding and report rendering

We chose R as the language to make the plots and to render the report in this HTML file. For each question we made a plot in a separate R file, to ease collaboration, code commenting and debugging. We then nested each plot in the R Markdown (.Rmd) format, where we could add commentary text like you are reading now. All .Rmd files are nested in a master report. This allowed us to make a report whose plots update immediately when the code changes, keeping the graphs in sync with the latest code and data. In the end it saved us time, and it makes reproducing this report as easy as possible.

Code Scaffold: from data to report; making separate plots from the data, nesting each plot in its question, including commentary, and placing each question in the report.
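The scaffold described above can be sketched as an R Markdown skeleton; the file and chunk names below are illustrative, not the actual repository layout:

````markdown
<!-- report.Rmd (master): each question is included as a child document -->
```{r child = 'question_popular_tools.Rmd'}
```

<!-- question_popular_tools.Rmd: commentary text plus the nested plot -->
Commentary text for this question goes here.

```{r popular-tools-plot, echo=FALSE}
source("plots/popular_tools.R")  # the separate R file that builds the plot
```
````

Knitting the master (e.g. with rmarkdown::render("report.Rmd")) re-runs the plot code, which is what keeps the graphs in sync with the latest code and data.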

3 Results

Below you can find the results. The result sections are based on the research questions. In each section you can find the plots, with some commentary explaining what the plot is about and which results are expected or unexpected.

Throughout the results, the Scholarly Communication Phases return in each section as separate paragraphs for Discovery, Analysis, Writing, Publication, Outreach and Assessment.

The plots also use the same colors to indicate the Scholarly Communication Phases.

Scholarly Communication Phases: Discovery, Analysis, Writing, Publication, Outreach and Assessment


This report has the following result sections:

  • The Demographics section shows numbers on segments of the survey respondents regarding countries, career groups, and disciplines.
  • The Popular tools section shows whether there is a relation between tools that are supported by the library and tools that are best known.
  • The OECD section shows the differences in tool usage between the VU and OECD countries.
  • The Career group section shows the differences in tool usage between tenured and non-tenured groups.
  • The Open Science section shows the intentions people have regarding open access and open science.
  • The Disciplines section shows the overall tool usage at the VU for each discipline.
  • The Detailed disciplines section shows the tool usage for each discipline, adding comparison information such as the OECD numbers for that discipline and the VU average.

4 Demographics

These demographics form the baseline of our study.

4.1 Survey outcomes

Number of respondents Value
World Wide 20663
OECD countries 15752
Netherlands 2041
VU and VUmc 531

Survey Respondents Worldwide

The values below are within the set of VU & VUmc respondents.

Discipline (multi-choice) Value
Physical Sciences 39
Engineering and Technology 35
Life Sciences 144
Medicine 181
Social Sciences and Economics 176
Law 26
Arts & Humanities 55
Role Value
Number of PhDs 230
Number of PostDocs 70
Number of (Associate, Assistant) Professors 188

First publication year Value
before 1991 61
1991-2000 70
2001-2005 55
2006-2010 79
2011-2016 168
not published (yet) 96

Country of affiliation Value
Netherlands 519
United States 3
Germany 2
Brazil 1
DR of Congo 1
India 1
Italy 1
Latvia 1
Turkey 1


4.2 Organisation demographics VU&VUmc

Below, the numbers of active scientific personnel at the VU on 30 June 2016 are given. For VUmc (Medicine), numbers from the 2015 annual report are used.

Faculty Number of scientific personnel
Theology (Godgeleerdheid) (1) 86
Humanities (Geesteswetenschappen) (2) 221
Law (Rechtsgeleerdheid) (3) 219
Social Sciences (Sociale Wetenschappen) (4) 224
Economics and Business Administration (Economische Wetenschappen en Bedrijfskunde) (5) 430
Sciences (Exacte Wetenschappen) (6) 390
Earth and Life Sciences (Aard- en Levenswetenschappen) (7) 450
Behavioural and Movement Sciences (Gedrags- en Bewegingswetenschappen) (8) 422
Medicine (Geneeskunde) (VUmc) (9) 1079
Dentistry (Tandheelkunde) (ACTA) (10) 251


4.3 Survey disciplines and faculty


In this table we put the number of respondents from each discipline next to the number of academic staff of each faculty. To know the response rate for each discipline, we need the number of VU respondents per discipline and the total number of potential VU respondents who could have filled out the survey.

For the first number, we count the disciplines in the VU respondents' data. For the potential response size of each discipline, we match the disciplines from the survey to the faculties. This way we get an impression of the response rate of each faculty.

We realise that these separations and unifications are somewhat artificial, and we have to remind the reader that one respondent could select multiple disciplines.

The number of VU scientific personnel representing a survey discipline is calculated by splitting or joining the numbers of academic staff at the faculties.

We split the number of the Sciences faculty to represent the disciplines Physical Sciences and Engineering and Technology. We joined the faculties Behavioural and Movement Sciences AND Earth and Life Sciences to represent respondents for the discipline Life Sciences. We also joined the faculties Medicine AND Dentistry to represent respondents from Medicine, and the faculties Social Sciences AND Economics and Business Administration to represent respondents from Social Sciences and Economics. Finally, the faculties Theology AND Humanities were joined to represent the respondents from Arts & Humanities.

The percentage of responses is calculated by relating the number of VU survey respondents for each discipline to the total number of academic staff who could have filled out the survey for that discipline.
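A minimal sketch of this calculation in R; the staff numbers come from the faculty table above, and the splits are those described in the text:

```r
# potential respondents per survey discipline (staff numbers, mid-2016)
potential <- c(
  "Physical Sciences"          = 0.5 * 390,  # half of the Sciences faculty
  "Engineering and Technology" = 0.5 * 390,  # the other half
  "Law"                        = 219
)
# VU respondents per survey discipline
respondents <- c(
  "Physical Sciences"          = 39,
  "Engineering and Technology" = 35,
  "Law"                        = 26
)
response_rate <- round(100 * respondents / potential)
response_rate  # 20, 18 and 12 percent respectively
```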

Survey Discipline Number of VU respondents (from survey) Faculty Number of VU scientific personnel (calculated potential) | % of response
Physical Sciences 39 Sciences (0.5) (1) 195 | 20%
Engineering and Technology 35 Sciences (0.5) (2) 195 | 18%
Life Sciences 144 Behavioural and Movement Sciences AND Earth and Lifesciences (3) 872 | 17%
Medicine 188 Medicine AND Dentistry (ACTA) (4) 1330 | 14%
Social Sciences and Economics 176 Social Sciences AND Economics and Business Administration (5) 654 | 27%
Law 26 Law (6) 219 | 12%
Arts & Humanities 55 Theology AND Humanities (7) 307 | 18%


6 Comparing VU&VUmc to OECD averages

The 101 Innovations survey received responses from many different countries. This makes it possible to compare responses from VU & VUmc faculty with international averages. We limit the comparison to the 34 OECD member states (checked 3 May 2016; note that the list also includes the Netherlands). Although this list comprises many different countries, responses from these countries are more comparable to the VU than those from the complete list, and allow for a more meaningful comparison. For example, respondents from countries with low GDP often use Zotero (free of charge), while EndNote (paid) is used more in countries with a higher GDP.

In the subsections below, we discuss the different research phases in turn. In each of the figures, hatched bars represent responses from VU and VUmc staff, and filled bars represent all other respondents from OECD countries. All data is represented as the percentage of all respondents in that group. For example, in the graph for Discovery_search it is shown that about forty percent of all OECD respondents have indicated using PubMed for Search, and about fifty percent of all VU&VUmc staff.
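With hypothetical per-respondent data, the percentage shown for each group would be computed along these lines:

```r
# TRUE if a respondent indicated using PubMed for search (toy data)
pubmed <- c(TRUE, TRUE, FALSE, TRUE, FALSE, FALSE)
is_vu  <- c(TRUE, TRUE, TRUE,  FALSE, FALSE, FALSE)  # VU/VUmc vs other OECD

# percentage of respondents in a group that use the tool
pct_use <- function(tool, group) round(100 * sum(tool[group]) / sum(group))

pct_use(pubmed, is_vu)   # 67: share of VU/VUmc respondents using PubMed
pct_use(pubmed, !is_vu)  # 33: share of the other OECD respondents
```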

Overall, differences between OECD and VU respondents are not very large, and it is difficult to discern structural deviations at the VU from the OECD averages. One noteworthy point of possible interest to the Library is that usage of Mendeley for discovery and reference management is higher at the VU than in other OECD countries, despite access to the EndNote application.

6.1 Discovery

The VU and VUmc favourites (Acrobat Reader, Google Scholar, and institutional access) are popular in the OECD as well. For searching, PubMed and Mendeley are used more often by VU staff, while Scopus is used less (no surprise here, since the VU had no subscription at the time of the survey).

6.2 Analysis

Use of SPSS as a tool for analysis is almost twice as high at the VU as the OECD average.

6.3 Writing

Mendeley users for reference management are strongly represented at the VU; the preference for Mendeley comes at the expense of all other tools except EndNote. For writing, VU respondents are relatively traditional, with high usage of MS Word and low usage of Google Docs and LaTeX.

6.4 Publication

Scopus usage is relatively low. Few VU respondents use the institutional repository for archival.

6.5 Outreach

There are no clear differences between the VU and OECD averages. VU respondents seem to use more tools for outreach to the general public (mainly Twitter, WordPress, and Wikipedia), but the differences are not substantial.

6.6 Assessment

Although few OECD researchers use tools for reviewing, VU researchers fall below even this international average. For measuring impact, Altmetric is less popular with VU researchers than in the OECD. The lower use of Scopus also stands out, but Scopus is not supported at the VU.

7 Tenured vs non-tenured researchers

In this section, we report on differences in tool usage between tenured and non-tenured researchers. We consider assistant professors, associate professors and full professors as tenured faculty; PhD students and postdoctoral researchers are grouped as non-tenured.

The first set of graphs is a quick summary of the tools that show the most pronounced differences between the two groups. We calculate the difference by subtracting the use in the tenured group from the use in the non-tenured group (both as percentages). The upper bars show the largest positive differences (i.e., the tool is more popular among non-tenured researchers); the lower bars show the largest negative differences.

In the second set of graphs, all tools in the survey are shown, sorted by research phase and research activity, with the most pronounced differences at the far right and far left of each diagram. The difference is calculated in the same way, by subtracting the use in the tenured group from the use in the non-tenured group (both as percentages). The bars on the far right show the largest positive differences (i.e., the tool is more popular among non-tenured researchers); the bars on the far left show the largest negative differences (i.e., the tool is more popular among tenured researchers).
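A sketch of this difference calculation in R; the usage percentages here are hypothetical, not survey results:

```r
# percentage of each group indicating use of a tool (illustrative numbers)
non_tenured <- c(mendeley = 45, endnote = 35, web_of_science = 20)
tenured     <- c(mendeley = 30, endnote = 45, web_of_science = 55)

# positive = more popular among non-tenured; negative = among tenured
difference <- non_tenured - tenured
sort(difference, decreasing = TRUE)
# mendeley 15, endnote -10, web_of_science -35
```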

7.1 Discovery

The differences in use of PubMed and of journal table-of-contents announcements stand out as the most significant discrepancies in the Discovery phase. Although not featuring in the 'top-2' figures, the use of Mendeley stands out when inspecting the more detailed graphs: non-tenured (generally younger) researchers use Mendeley more often in the Reading, Searching and Alerting activities within the Discovery phase.

7.2 Analysis

Tool use for analysis is stronger among non-tenured researchers across the board. This holds for relatively new (and more open) tools such as R and Python, as well as for long-standing software such as Excel and MATLAB. The large difference for SPSS is no outlier. Tools for sharing analysis scripts are not very popular, and their usage is low overall. Somewhat unexpectedly, use of the Open Science Framework is stronger in the tenured than in the non-tenured group. This could be because the OSF is often used for grant applications, which is arguably a more important activity for tenured researchers.

7.3 Writing

The importance of Mendeley in the research workflow of non-tenured researchers is again apparent in the Writing phase. Among this group, Mendeley is the most popular reference management software, more popular than EndNote, which is the most popular reference tool for tenured researchers. For the writing itself, MS Word is by far the most popular tool in both groups.

7.4 Publication

In general, tenured researchers use more tools in the Publication phase; probably they simply publish more. This makes it difficult to interpret these figures properly. A few tools stand out. First, PubMed is relatively popular for archival of publications for non-tenured researchers, although in absolute terms ResearchGate is the most popular repository for both groups. GitHub is used mostly by non-tenured researchers as a repository for scripts and software code.

7.5 Outreach

Tenured researchers seem to spend more effort on their research profile, as tool use in this phase is higher for that group. ResearchGate is popular among both groups. Although the differences are less pronounced, tenured researchers also use more tools for outreach to a broader public.

7.6 Assessment

The difference in use of Web of Science indicators for impact assessment is striking: about 55% of tenured researchers indicate using the tool, versus approximately 20% of non-tenured researchers. Altmetrics and the PLoS metrics are not very popular (yet) in comparison, and are used by both groups, although slightly more by non-tenured researchers.

8 Open Access and Open Science

8.1 Open Intention for Tenures vs Non-tenures

Although people are not always familiar with the tools they can use for engaging in open science, overall they intend to be supportive of Open Access and Open Science.

Apart from the multiple-choice questions, there was one open question:

“What do you think will be the most important development in scholarly communication in the coming years?”

This question got a tremendous response (N=341; 64%), which gives a better view of respondents' worries and hopes about the future of conducting science. The answers show which directions academic support could invest in.

We processed the free-text answers as follows: first we labelled all answers as expressing a threat or an opportunity, then we filtered down to the senior researchers (N=188; 35%) to get a manageable sample, and finally we labelled each answer with the research phase and activity it related to. This way we hope to get a perception of their worries and hopes for the future of scholarly communication.

8.1.1 Worries

The professors expressed worries (N=6, 1%) about the following. The number of journals and locations where publications can be found will increase, and if journals disappear where knowledge is democratised in a Wikipedia fashion, it becomes hard to trust or distinguish the quality of a scientific paper or fact (related to Discovery-Search and Assessment-Review). Within the realm of journals as the only option for publishing, there are worries on the one hand about open access journals that charge too high a price to publish, and on the other hand about the quality of open access journals, not only to read but also to publish in (related to Discovery-Access and Publication-Submit). In other words, the current perception is that a system is lacking that can create a market to increase the quality and trustworthiness of open access journals and reduce the price of publishing. (DOAJ.org and QOAM.eu are means for making the quality of open access journals transparent.)

  • The Wikipedification of science: everybody’s opinion will be counted equally. With devastating effect.
  • First Open Access pest will continue to explosively grow; Then, we will all be shocked about the mess, discuss how this was possible and try (but fail) to find those to blame; Return to pay for quality system instead of pay for publishing-no-matter-how-bad
  • As always in history, tendencies are like swings or pendula, going to and fro. The tendency of recent past to open up, to tear down walls and borders is approaching its apex. We will witness a return movement towards closing, protecting, guarding. Just as we see this happening in the world with respect to national boundaries. Academia is slowly already moving from a free space towards distinct ‘gated communities’ so to speak. Of course that tendency eventually will reach its apex as well. After which a renewed swing towards openness will take place. Probably not in my lifetime, however.
  • Increase in number of journals and publication locations
  • Unfortunately, more formulas to quantify science Output/Impact across very diverse fields.
  • Development of prices for journals and for publications in journals. Prices are too high, and resistance is growing.

8.1.2 Hopes

The hopes that the professors (N=53, 10%) expressed for the future of scholarly communication were the following, ordered by research phase:

8.1.2.1 Discovery

As a researcher one needs to know the latest developments in one's field without being swamped in irrelevant information, and once a relevant paper is found, one must be able to read it without restriction. The ultimate hope is that there is no need to read through all the papers, but instead to be alerted only to the latest and necessary information for one's current research domain, drawn from the complete scientific corpus, openly accessible to knowledge aggregators and human provenance readers. In terms of more traditional scholarly communication, there is a hope for more clarity on the quality of scientific content, so that the fewer papers remaining in the scientific corpus are of higher quality, or at least can be filtered on quality standards. There is the hope that one day there will be no need to pay, and that all scientific content will be freely accessible. Once a paper is accessed, there is the hope of more interaction with the content (see for example readcube.com and utopiadocs.com) and with other readers (see for example hypothes.is).

8.1.2.1.1 sample of answers:
  • Search

  • reduction of output (less is more)
  • tools that aggregate scattered knowledge in papers;
  • More clarity about the quality of journals.
  • Researchgate as primary tool

  • Access

  • open access online, platform for sharing between researcher without paying
  • scientific articles as ‘free’ pdf in the ‘cloud’
  • Researchgate as primary tool

  • Read

  • Projects like Reader which allow much more interaction with papers
  • Researchgate as primary tool
  • tools that aggregate scattered knowledge in papers;

8.1.2.2 Analysis

When designing and carrying out research, the hopes are that there will be more emphasis on replication, by working in concurrence and by preregistering protocols and experiment designs. See for example osf.io, protocols.io, scientificprotocols.

The hope is also that working in concurrence forces researchers to deliver improved descriptions of methods and to increase the quality of the data. The hope is expressed that this will be required for any valid publication.

Research is so diverse and heterogeneous that no single tool can do the job; the scientific endeavour needs different tools for different research groups in different phases. To excel in science, the hope is that the university will apply 'super segmentation', giving researchers access to the best tools for the right task at the right moment. The 101 research tools database could be a start for this.

8.1.2.2.1 sample of answers:
  • Sharing

  • More emphasis on replication
  • More concurrence
  • It should be making Science reproducible again: better description of the methods, better quality data and structured obtained knowledge should be required for any publication.
  • Publication based on preregistration of protocols/experiments

  • Analyzing

  • One important concept in marketing is the idea of “super segmentation”. Lots of excellent tools are coming available, but they are differentials relevant for different people, and different people need to integrate them. It will become a challenge how to integrate that most optimally. My suspicion is that the university- slow as it is- won’t be able to handle this development. So either the universities adapt and let people operate in start-up like enterprises within the university, or good researchers break away from universities

8.1.2.3 Writing

The hope for scholarly communication in the writing phase is for more responsive communication, without researchers losing credit for their contributions. Writing could be done in smaller iterations, where the feedback helps in building up towards completing a milestone. This resembles contributions in the open-source software industry, where working with nightly builds and milestone releases is common practice.

One example of a platform built with responsive communication, version control, collaborative editing for mixed teams of LaTeX and rich-text writers, and commenting during drafting or after publishing, is overleaf.com.

There are also hopes for support in making complex material easier to fathom through visualisations and animations.

8.1.2.3.1 sample of answers:
  • Write

  • online communication tools
  • Open Access but also sites like Academia, where you just put your text online and receive feedback, even without a publisher in between.
  • Supporting written material by animation.

8.1.2.4 Publication

Hopes are that, for deciding where to submit a paper, indicators other than the journal impact factor will be considered, such as: the author's right to open access to the publication, publication limits or APC budgets, quality factors of a journal (see qoam.eu and doaj.org), and data policy compliance; even the necessity of publishing with a journal as intermediary can be questioned.

For publishing text, hopes are up for open access, removing gatekeepers, and enabling text and data mining, but there is more to it. Hopes are also up for faster publication of smaller research deliverables in outlets other than journals: disseminating and claiming ideas in preprint proofs, preregistering the research design or hypothesis, storing the findings together with the processed/intermediate data and the code/tool/app/container, including the raw data used for processing, publishing the final results with links to the data, and evaluating results and data in peer-review channels. Examples of platforms built for this are f1000research.com (“publish now, review later”) and osf.io (preregistration and sharing of intermediate results).

For hopes related to storing data, have a look at re3data.org, a comprehensive overview of data repositories worldwide, where you can filter on subject, quality seals, persistency policy, reuse licences and more.

Of course, big-data enthusiasts hope that peer-reviewed results can become fuel to create knowlets and nanopublications, building big-data knowledge graphs that improve scientific discovery methods and highlight research opportunities in networks across researchers (nanopub.org).

8.1.2.4.1 sample of answers:
  • Submit
  • The manner in which publishers contract with institutions and authors about access and publishing.
  • Journals will lose dominance in favor of direct online publication, review and discussion
  • A cap on the # of papers published per researcher, to promote quality over quantity
  • More clarity about the quality of journals.
  • Elsevier will loose marketshare

  • Publishing

  • the further growth of open access journals
  • open access without print, fast publication rate (2 to 3 weeks following submission, see eLife)
  • Open access of publicly funded research.
  • detach scientists from traditional journal format and publishing
  • open access, other outlets than journals
  • Open access and blogging
  • Open access; Open data; preregistration; bypassing the regular journals
  • The need to advertise your own work and not get lost in the mass of papers that are published every day. In addition, the desire to make data available through repositories (such as a website, data dryad, or figshare)
  • Grey literature. Pre-print proofs of articles shared on the web, online discussions. So I think that official peer reviewed channels will remain important for evaluations, but less for the dissemination of ideas.
  • Open access + Publishing of raw and processed data and intermediate results
  • nanopublications and shareable knowlets
  • Researchgate as primary tool

  • Archiving

  • Self-archiving, Open Access

  • Datasharing

  • Sharing data
  • open acces publication and data sharing
  • sharing of data sets
  • (1) Open data & analysis; (2) tools that aggregate scattered knowledge in papers; (3) open access
  • sharing research data for reproducibility of research results
  • Open access to data, open review
  • The need to advertise your own work and not get lost in the mass of papers that are published every day. In addition, the desire to make data available through repositories (such as a website, data dryad, or figshare)
  • Open access + Publishing of raw and processed data and intermediate results
  • datasets on which publications are based will become available
  • online availability of data accompanying publications
  • Publication in the form of tools, apps to interpret data

8.1.2.5 Outreach

Hopes are to make it easier to advertise and track your own work on social networks like ResearchGate, Academia.edu, Twitter and LinkedIn. For managing and tracking your outlets, see for example growkudos.com, or other online trainings and workshops.

In this phase too, hopes are to get support in making complex material easier to fathom through visualisations and animations.

8.1.2.5.1 sample of answers:
  • Profile and Popular

  • Shift to open access publication and the shift from author’s own websites to Research Gate and other networks of that type.
  • Social Media such as ResearchGate
  • The need to advertise your own work and not get lost in the mass of papers that are published every day. In addition, the desire to make data available through repositories (such as a website, data dryad, or figshare)
  • Researchgate as primary tool

8.1.2.6 Assessment

Hopes are to decouple peer review from the publishing location or traditional journal; this makes a self-published and self-promoted contribution to your field recognisable as an evaluated and valid part of the scientific corpus. At the same time there is hope that the effort researchers put into rigorously peer-reviewing the work of others gets recognised as well. This can happen through open interactive discussion in comment sections, or in dedicated peer-review channels. For solutions that decouple peer review and provide recognition, have for example a look at publons.com, peerageofscience.org or rubriq.com. For informal open discussions, have a look at hypothes.is

For impact, hopes are that assessment will not be based on the quantity of publications one can crank out, but on the quality of the contribution to science. Recognition could come from your code on GitHub being reused and forked, or from the badges you have earned for different parts of your research contributions (see openresearchbadges.org). To quantify the alternatives for scientific impact, have a look at impactstory.org or altmetric.com

8.1.2.6.1 sample of answers:
  • Review

  • modernizing peer review and publishing beyond topical journals
  • Open access, open review and open discussions. Projects like Reader which allow much more interaction with papers
  • A change in how the reviewing process takes place. Some journals have already adopted a hybrid reviewing process that enables different types of submission procedures. I believe this trend will continue in an attempt to make the publication process more objective and less prone to all kinds of unethical behavior in order to conform to publication pressure.
  • online discussions such as on research gate
  • Open access to data, open review
  • interactive comments
  • Grey literature. Pre-print proofs of articles shared on the web, online discussions. So I think that official peer reviewed channels will remain important for evaluations, but less for the dissemination of ideas.
  • Post publication peer review opportunities, because peer review fails

  • Impact

  • I hope a shift away from quantity to quality
  • I am convinced that the absurd focussing on bibliometrics will diminish if not vanish
  • assessment on how to monitor impact


9 Tools per discipline

As a quick summary we have made a table showing the single most used tool per research phase in each discipline. A more detailed explanation is given in the sections below the table, but for the most used tool we can state the following:

  • Discovery: Most disciplines use Google Scholar to discover new literature. Medicine uses PubMed as its primary source for search. One could say that Life Sciences finds having campus access to literature more important than searching for that literature, but the detail section below gives a more elaborate picture, with their attention for search spread between Google Scholar and PubMed.

  • Analysis: MS Excel is the most popular tool for analysis in all disciplines, except for Medicine, which uses SPSS.

  • Writing: MS Word is the most popular tool for writing in all disciplines, except for Engineering & Technology, which uses LaTeX.

  • Publication: Publishing in traditional topical journals is still by far the most popular publication method, despite the high support for Open Access.

  • Outreach: ResearchGate is the most popular platform for profiling your research within the research community, except in two disciplines that use it slightly less: Engineering & Technology uses Google Scholar Citations a bit more, and Arts & Humanities uses Academia.edu more.

  • Assessment: Physics, Medicine, Life Sciences and Law use Web of Science for the assessment of their research, and the other disciplines use the Journal Citation Reports; both contain the same impact factor calculated from journals in the ISI database. Internationally there is a lot of debate about whether the merit of an article should count, rather than the merit of the journal. Also discussed is the reward system to give credit where credit is due.

In the sections below we show the tool usage for each research discipline side by side. This gives us the opportunity to see whether a discipline uses a tool more or less than the others.
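The summary above boils down to selecting, per discipline and research phase, the tool with the highest number of users. A minimal sketch of that selection, using hypothetical counts rather than the actual survey data:

```python
# Pick the single most used tool per (discipline, phase).
# The records below are hypothetical examples, not the survey counts.
records = [
    ("Medicine", "Discovery", "PubMed", 120),
    ("Medicine", "Discovery", "Google Scholar", 95),
    ("Law", "Analysis", "MS Excel", 14),
    ("Law", "Analysis", "Atlas.TI", 9),
]

most_used = {}
for discipline, phase, tool, users in records:
    key = (discipline, phase)
    # Keep the tool with the highest user count for this key.
    if key not in most_used or users > most_used[key][1]:
        most_used[key] = (tool, users)

print(most_used[("Medicine", "Discovery")][0])  # PubMed
print(most_used[("Law", "Analysis")][0])        # MS Excel
```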

9.1 Discovery

  • For reading articles the majority of the disciplines use PDF; half of them read online in a browser. Mendeley is also fairly well known, but less so in Law and Arts & Humanities, which seem to prefer iAnnotate instead. ReadCube seems to have a targeted audience in Physics and Engineering. Hypothes.is and Utopia Docs are promising young tools that have not seen a big uptake yet.
  • Google Scholar is overwhelmingly used for searching literature by all disciplines; in addition, PubMed is widely used in Medicine and Life Sciences. WorldCat is mostly used by Law and Arts & Humanities. Of the two competing bibliographic databases, Scopus and Web of Science, the latter has more uptake - not surprisingly, because the VU Library has a licence - especially in Physics and in Social Sciences and Economics. We are surprised that Scopus is used at all, since people would have to get access elsewhere; most Scopus users are in Physics and Engineering. Compared to VU and VUmc, OECD countries use Scopus almost a factor of three more. Mendeley, however, also an Elsevier product like Scopus, is used as an alternative to search for literature, mostly by Life Sciences.
  • Alert services to discover new literature are less known, or less used because of the annoying overabundance of e-mails in the mailbox. To tackle this problem, young services like F1000Prime provide hand-picked, curated recommendations from senior researchers, and Sparrho uses adaptive algorithms to present only articles relevant to your specific research field from a wide variety of sources.
  • To gain access to literature, all disciplines rely on the subscriptions for on-campus access. When not on campus, or when journals fall outside the subscription packages, alternative methods are used, like ResearchGate, or asking the author directly where relationships are close. These results are no different from other OECD countries. Even the browser plugin Open Access Button is used, mostly in Medicine and Life Sciences, to gain access to toll-gated literature, either by searching for an open access alternative or author version, or by finding the e-mail address of the author. Pay-per-view is rarely used, and the model of renting articles in services like DeepDyve - a Spotify model for scientific articles - is still an unknown anomaly in scholarly communication. One could wonder which legal alternative will grow when subscriptions for on-campus access end.
Legend for the Discipline Colors

9.2 Analysis

  • Sharing the method or workflow for the analysis as part of the article is nowadays common practice. Having a separate platform for sharing the analysis, to make the research more easily reproducible, is not common practice yet. This might be because there is no honour in reproducing research, only in advancing science with new findings. But these platforms can also be used to pre-register a hypothesis and method. In Social Sciences we see more familiarity with the Open Science Framework, where there is more attention to pre-registration of hypotheses and replication studies. In Medicine there is also a little familiarity with a service like Scientific Protocols.

    • Other sharing methods mentioned: Evernote, OneNote, Google Keep, Media Wiki, Google Drive, SURFdrive, Apple iCloud, Dropbox, E-mail, Institutional shared folders, Basecamp, GitHub, E-Notebook, eLabjournal.com, ResearchGate, Mindly, Paper, Trello, Design paper, Netherlands Trial Register, project websites, Podio, Clinical trial.com (edit:clinicaltrials.gov)
  • For the analysis, many disciplines use their own specific tools. Excel is the common tool for at least 50 per cent of the Law and Arts & Humanities communities, and for even more in the other disciplines. iPython, R and MATLAB are used mostly by Physics and Engineering & Technology, while R is also known among Life Scientists. SPSS is the commercial package that is used intensively in Medicine, Social Sciences & Economics and Life Sciences. Unknown as yet, but interesting for the digital humanities, are DHBox and rOpenSci, both ready-to-go configurations of computational tools: the first a runtime environment in the cloud with R and iPython, the other an extensive software library for R. In the survey the following tools were mentioned by VU and VUmc researchers in different disciplines. Some were mentioned frequently, like NVivo, or across disciplines, like Atlas.TI.

    • Physics: Fortran, Wolfram Mathematica, Linux, GAMS, ArcGIS, Origin, Gradeprofiler, Python, Java, C++, Comprehensive Meta-analysis (CMA), SAS, Mplus, Galaxy, PQ method, Atlas.TI, open office spreadsheet, Glotaran, R package TIMP
    • Engineering & Technology: Atlas.TI, OpenRefine, Python, Oxmetrics, Semantic web platforms, ACQknowledge, GraphPad, open office spreadsheet, Java, Glotaran, R package TIMP
    • Medicine: MaxQDA, Atlas.TI, ReviewManager, GraphPad Prism, SAS, Mplus, Apple Numbers, Stata, Comprehensive Meta-analysis (CMA), StatView, Review Manager (systematic reviews), FSL, Flowjo, MS Access, Snapgene, Accurri Analysis, instrument specific software, MLWIN, MS Word, Mindmeister, clonemanager, softmax, Vinci, Galaxy, Python, Java, C++, ACQknowledge, openMx, Plink, Wolfram Mathematica, PQ method, LibreOffice, picture analysis, Statistica, Spike/Signal
    • Social Science & Economics: Atlas.TI, Mplus, STATA, QSR Nvivo, MaxQDA, C++, SAS, Review Manager (systematic reviews), Comprehensive Meta-analysis (CMA), Python, MS Word, Transana, Mlwin, Oxmetrics, QGIS, Amos, Lyx, winedt, JASP, Wolfram Mathematica, Gephi, UCINET, NodeXL, ORA, ConText, Netdraw (social network analysis software), AmCat (Amsterdam Content Analysis Toolkit), Python package Pandas, ArcGIS, GAMS, Mindmeister, SmartPLS, Dedoose, Lingo software, maxima, EQS, PQ method, MS Access, fs/QCA software, Lisrel.
    • Law: Atlas.TI, MS Word, various text mining tools
    • Arts & Humanities: MS Access, MS Word, MPlus, Atlas.TI, Python, MaxQDA, Concordance software e.g. AntConc, AmCat (Amsterdam Content Analysis Toolkit)
Legend for the Discipline Colors

9.3 Writing

  • MS Word is the favorite office tool for writing, but Engineering & Technology uses Google Docs and Overleaf more often for collaborative writing than the other disciplines.
  • For managing references, EndNote is the most popular in all disciplines except Engineering & Technology, where they prefer Mendeley. EndNote and Mendeley have a similar user base, except that Mendeley is used vastly more by younger researchers. Something to look into in the future, but we can imagine that many PhD candidates want something that works right after a download, instead of getting a licence token from the university administration. While Mendeley is used less by Arts & Humanities, this group uses the open source reference manager Zotero much more than the other disciplines.
Legend for the Discipline Colors

9.4 Publication

  • Although all disciplines publish in traditional journals, it is researchers in Medicine and Life Sciences who publish in OA topical and mega-journals.
  • To decide which journal to submit an article to, the Journal Citation Ranking is still the leading indicator for most disciplines, except for researchers in Law, who seem to lead in using journal assessment platforms with an open access focus, like the Directory of Open Access Journals (DOAJ), Quality Open Access Market (QOAM), Sherpa/Romeo and Journalysis. Also mentioned were Eigenfactor.org, advice from supervisors and peers, the metrics from Google Scholar, and similarity to and reputation of authors in reference lists.
  • Most disciplines recognise ResearchGate as a place for archiving and sharing publications. Archiving scientific output to safeguard the corpus for future generations is not yet common practice in all disciplines, but Physics and Engineering & Technology have used arXiv for years to publish preprints, both to speed up the scientific process and to claim their findings at that particular date. Other expected patterns are visible: Life Sciences and Medicine use PubMed Central, and the institutional repository is known across all disciplines. Strangely, SSRN is used a lot more by Law than by Social Sciences, where one would expect it. We expected bioRxiv to be familiar among the Life Sciences, but the service started only a few years ago.
  • Although a plenitude of platforms is available to share and archive data, code and presentations, only GitHub and Bitbucket see substantial use, mainly by one discipline: Engineering & Technology. Other suggestions given for sharing code, data and presentations: Open Science Framework (OSF), Dropbox, OneDrive, SURFdrive, SURFsara.nl, institutional shared folder, SPSS, SurveyMonkey, external hard drive, e-mail, EDUgroepen.nl, OpenClinica, Mendeley Data, SVN, GEO, GitLab, EGA, tranSMART, B2SHARE, own website, supplement of papers, Academia.edu, dedicated repositories, WeTransfer. For archiving data in a discipline, the best place to start is the Registry of Research Data Repositories at re3data.org.
Legend for the Discipline Colors

9.5 Outreach

  • We see that Engineering & Technology uses Google Scholar Citations for researcher profiles more than the other disciplines. This might be related to the recent obligation from the Faculty of Sciences to use this channel as the official outlet for their academic work. Again ResearchGate pops up as a platform broadly used to display your work in all disciplines. Where we see two dips for Arts & Humanities and Law at ResearchGate, they re-appear as spikes at Academia.edu. We also see that these disciplines are less familiar with ORCiD than the other disciplines. The profiles at our own institution are familiar, but we hear complaints about the lack of control and the slow pace at which researchers can influence these pages. That is why respondents also mentioned their own website a number of times.
  • Twitter is the most popular outlet for mentioning scientific findings to the public in Arts & Humanities and Social Sciences & Economics, but Engineering & Technology also likes to use it to show off achievements. The same groups like to inform the public by placing information on their work on websites or blogging platforms like WordPress. In smaller numbers, but fairly distributed across all disciplines, researchers improve Wikipedia with their findings. The startup GrowKudos.com, which manages the distribution and measures the impact of your work on blogs and social media networks (Twitter, Facebook, LinkedIn), is still largely unknown. Other mentions for public outlets were newspapers, Facebook and LinkedIn.
  • Archiving posters and presentations is not common practice; only Engineering & Technology uses SlideShare, Figshare and Vimeo. Also mentioned were Prezi, Dropbox, YouTube, own websites, ResearchGate and Academia.edu.
Legend for the Discipline Colors

9.6 Assessment

  • Using services for peer review organised beyond journals is largely unknown territory; common practice is to go with the review process organised by a journal. Decoupling this process makes it possible to validate the research and maintain trust in the scientific findings, while publishing and spreading it across multiple platforms for greater reach. The other method mentioned was discussing with peers.
  • To measure the impact of one’s output, Life Sciences, Medicine, Physics and Social Sciences & Economics look at the Journal Citation Reports (JCR) and Web of Science as a reference. Other mentions were the Google Scholar citation index and, in Social Sciences & Economics, the Eigenfactor. InspireBeta.net in high-energy physics is also a good example of profile metrics: preprint/postprint ratio, citation breakdown in clusters, filters on self-citation, publication type, co-authors, keyword frequencies, a publication timeline graph, etc. And the ERIM Journals List (EJL) from Erasmus University is a good example of impact assessment scaled to a discipline-specific area with its own additional criteria.
Legend for the Discipline Colors

10 Detailed overview for each discipline

In this overview we show the graphs focused on each discipline in each research phase. Next to these bars we place additional bars, so you can compare the discipline against the VU & VUmc average and the OECD average for that discipline.

Legend for tool usage within disciplines compared to the VU average and the OECD average for that discipline

10.1 Arts & Humanities (N = 55)

10.1.1 Discovery

10.1.2 Analysis

10.1.3 Writing

10.1.4 Publication

10.1.5 Outreach

10.1.6 Assessment

10.2 Engineering & Technology (N = 35)

10.2.1 Discovery

10.2.2 Analysis

10.2.3 Writing

10.2.4 Publication

10.2.5 Outreach

10.2.6 Assessment

10.3 Law (N = 26)

10.3.1 Discovery

10.3.2 Analysis

10.3.3 Writing

10.3.4 Publication

10.3.5 Outreach

10.3.6 Assessment

10.4 Life Sciences (N = 144)

10.4.1 Discovery

10.4.2 Analysis

10.4.3 Writing

10.4.4 Publication

10.4.5 Outreach

10.4.6 Assessment

10.5 Medicine (N = 181)

10.5.1 Discovery

10.5.2 Analysis

10.5.3 Writing

10.5.4 Publication

10.5.5 Outreach

10.5.6 Assessment

10.6 Physical Sciences (N = 39)

10.6.1 Discovery

10.6.2 Analysis

10.6.3 Writing

10.6.4 Publication

10.6.5 Outreach

10.6.6 Assessment

10.7 Social Sciences & Economics (N = 176)

10.7.1 Discovery

10.7.2 Analysis

10.7.3 Writing

10.7.4 Publication

10.7.5 Outreach

10.7.6 Assessment


6 VU&VUmc vs OECD countries

Despite the fact that the survey has responses from many different countries, we limit the analysis to the 34 OECD member states (checked 3 May 2016), as these countries are more similar to the Netherlands, and comparison is more meaningful. For example, respondents from countries with low GDP often use Zotero (free of charge), while EndNote (paid) is used more in countries with a higher GDP.

The figures below compare respondents from VU University to respondents from OECD countries. OECD respondents are indicated with solid coloured bars; VU respondent bars are hatched. All data is reported in percentages; that is, a bar reaching 80 for Google Scholar in the Discovery: Search graph indicates that 80 per cent of the respondents in that group reported using Google Scholar for Search in the Discovery phase. We report all tools per subactivity.
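The percentages behind these bars can be derived directly from the raw responses: count, per tool, how many respondents in a group report using it, and divide by the group size. A minimal sketch with hypothetical answers (not the actual survey export):

```python
def tool_percentages(responses):
    """responses: one set of tools per respondent for a given activity."""
    n = len(responses)
    counts = {}
    for tools in responses:
        for tool in tools:
            counts[tool] = counts.get(tool, 0) + 1
    # Convert raw counts to the percentage of respondents using each tool.
    return {tool: 100.0 * c / n for tool, c in counts.items()}

# Hypothetical answers to "which tools do you use for Search in Discovery?"
vu_answers = [
    {"Google Scholar", "PubMed"},
    {"Google Scholar"},
    {"Scopus"},
    {"Google Scholar", "Scopus"},
]
print(tool_percentages(vu_answers)["Google Scholar"])  # 75.0
```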

Overall, differences between OECD and VU respondents are not very large, but a few tools stand out.

6.1 Discovery

Mendeley is used relatively often at the VU for reading and searching.

6.2 Analysis

Use of SPSS as a tool for analysis is much larger at the VU than for the OECD average.

6.3 Writing

As in the Discovery phase, Mendeley users are strongly represented at the VU for reference management. The preference for Mendeley comes at the expense of all other tools except EndNote. For the writing itself, VU respondents are relatively traditional, with high usage of MS Word and low usage of Google Docs and LaTeX.

6.4 Publication

Scopus usage is relatively low. Few VU respondents use the institutional repository for archiving.

6.5 Outreach

6.6 Assessment

7 Tenured vs non-tenured researchers

In this section, we report on differences in tool usage between tenured and non-tenured researchers. We consider assistant professors, associate professors and full professors as tenured faculty; PhD students and postdoctoral researchers are grouped as non-tenured.

The first set of graphs is a quick summary of the tools that show the most pronounced differences between the two groups. We calculate the difference by subtracting the usage in the tenured group from the usage in the non-tenured group (both as percentages). The upper bars show the largest positive differences (i.e., the tool is more popular among non-tenured researchers); the lower bars show the largest negative differences.

The second set of graphs shows all tools in the survey, sorted by research phase and research activity, with the difference calculated in the same way. The bars on the far right show the largest positive differences (i.e., the tool is more popular among non-tenured researchers); the bars on the far left show the largest negative differences (i.e., the tool is more popular among tenured researchers).
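The difference calculation described above can be sketched as follows, with hypothetical usage percentages for each group:

```python
# Tool usage difference: non-tenured minus tenured, in percentage points.
# The numbers below are hypothetical, for illustration only.
tenured = {"Web of Science": 55, "Mendeley": 20, "EndNote": 45}
non_tenured = {"Web of Science": 20, "Mendeley": 45, "EndNote": 40}

difference = {t: non_tenured[t] - tenured[t] for t in tenured}
# Sort from largest positive to largest negative difference.
ranked = sorted(difference.items(), key=lambda kv: kv[1], reverse=True)

# Positive values: the tool is more popular among non-tenured researchers.
print(ranked[0])   # ('Mendeley', 25)
print(ranked[-1])  # ('Web of Science', -35)
```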

7.1 Discovery

The differences in use of PubMed and of table-of-contents announcements for journals stand out as the most significant discrepancies in the Discovery phase. Although it does not feature in the ‘top 2’ figures, the use of Mendeley stands out when inspecting the more detailed graphs: non-tenured (generally younger) researchers use Mendeley more often in the Reading, Searching and Alerts activities within the Discovery phase.

7.2 Analysis

Tool use for analysis is stronger among non-tenured researchers across the board. This holds for relatively new (and more open) tools such as R and Python, as well as for long-standing software such as Excel and MATLAB. The large difference for SPSS is no outlier. Tools for sharing analysis scripts are not very popular, and usage is low overall. Somewhat unexpectedly, use of the Open Science Framework is stronger in the tenured than in the non-tenured group. This could be because the OSF is often used for grant applications, which is arguably a more important activity for tenured researchers.

7.3 Writing

The importance of Mendeley in the research workflow of non-tenured researchers is again apparent in the Writing phase. Among this group, Mendeley is the most popular reference management software, more popular than EndNote, which is the most popular reference tool for tenured researchers. For the writing itself, MS Word is by far the most popular tool in both groups.

7.4 Publication

In general, tenured researchers use more tools in the Publication phase, probably simply because they publish more. This makes these figures difficult to interpret properly. A few tools stand out. First, PubMed is relatively popular for archiving publications among non-tenured researchers, although in absolute terms ResearchGate is the most popular repository for both groups. GitHub is used mostly by non-tenured researchers as a repository for scripts and software code.

7.5 Outreach

Tenured researchers seem to spend more effort on their research profile, as tool use in this phase is higher for that group. ResearchGate is popular in both groups. Although differences are less pronounced, tenured researchers also use more tools for outreach to a broader public.

7.6 Assessment

The difference in use of Web of Science indicators for impact assessment is striking: about 55% of tenured researchers indicate using the tool, versus approximately 20% of non-tenured researchers. Altmetrics and the PLOS metrics are not very popular (yet) in comparison, and are used by both groups, although slightly more by non-tenured researchers.

8 Open Access and Open Science

8.1 Open Intention for Tenured vs Non-tenured Researchers

8.2 Open Intention per discipline

9 Tools per discipline

As a quick summary we hafe made a table to show the one most used tool per research phase in each discipline. A more detailed explanation is given in the section below the table, but for the one most used tool we can state the following:

  • Discovery: Most disciplines use Google Scholar to discover new literature. Medicine use PubMeb as their primary source for search. One could say that Lifesciences find having campus access to literature more important than searching for that literature, but in the detail section below we see a more elaborate explanation, where their attention for search is spread between Google Scholar and PubMed.

  • Analysis: MS Excel is the most popular tool for analysis in all disciplines, except for Medicine where they use SPSS.

  • Writing: Here MS Word is the most popular tool for writing in all disciplines, except for Engineering&Technology where they use LaTeX.

  • Publication: Pubishing in Traditional Topical journals is still by far the most popular publication method, despite of the high support for Open Access.

  • Outreach ResearchGate is the most popular platform for profiling your research within the research community, except for two disciplines who use RG slightly less. Engineering&Technology use Google Scholar Citations a bit more, and Arts&Humanities use Academia.edu more.

  • Assessment: Physics, Medicine, Lifesciences and Law use Web of Science for assessment of their research, and the oter disciplines use the Journal Citation Register, which both contain the same impact factor calculated from journals in the ISI database. Internationally there is a lot of debate going on if the merit of an article should count, and not the merit of the journal. Also discussed is the reward sysem to give credit where credit is due.

In the sections below we show the tool usage for each research discipline next to each other. This gives us the opportunity to see if there is a discipline usign a tool more or less than others.

9.1 Discovery

  • For reading articles the majority of the disciplines use pdf, half of them read online in a browser. Also Mendeley is pretty known, but lesser in Law and Arts & Humanities, where they seem to prefer to use iAnnotate instead. ReadCube seems to have a targeted audience for Physics and Engineering. Hypothesis and UtopiaDocs are promising tools young tools and haven’t have a big uptake yet.
  • Google Scholar is overwhelmingly used for searching literature by all disciplines, adding to that for medicine and life sciences pubmed is widely used. World Cat is mostly used by Law and Art & Humanities. From the two competing bibliographic databases Scopus and Web of Science, the later has -not surprisingly because the VU Library has a license- more uptake, especially by Physics and Social sciences and Economics. We are surprised that Scopus is used, where people should get access from elsewhere. Most users from Scopus are in Physics and Engineering. Compared to OECD countries, these countries use Scopus almost a factor three more than VU and VUmc. Mendeley however, also an Elsevier product as Scopus, is used as an alternative to search for literature, mostly by Life sciences.
  • The alert services to discover new literature is less known, or lesser used because of the annoying over abundance of e-mails in the mailbox. To tackle this problem young services like F1000 Prime provides hand picked curated recommendations from senior researchers. And Sparrho uses adaptive algorithms to present only articles relevant for your specific research field from a wide variety of sources.
  • To gain access to literature all disciplines rely on the subscriptions for on-campus access. When not on campus or then journals fall out of the subscription packages alternative methods are used, like ResearchGate, and asking the author directly where the relationships are close. These results are no different compared to other OECD countries. Even the browser plugin Open Access Button is used, mostly in medicine and life sciences, to gain access to toll gated literature, either by searching for the open access alternative / author version, or by finding the e-mail address of the author. Pay per view is rarely used, but also the model for renting articles in services like Deepdyve - a Spotify model for scientific articles - is still an unknown anomaly in scholarly communication. One could wonder which legal alternative will increase, when subscriptions for on campus access end. Discipline Colors LegendLegend for the Discipline Colors

9.2 Analysis

  • Sharing the method or workflow of an analysis as part of the article is nowadays common practice. Using a separate platform to share the analysis, so that the research is more easily reproducible, is not common practice yet. This might be because there is no honor in reproducing research, only in advancing science with new findings. But these platforms can also be used to pre-register a hypothesis and method. In the Social sciences we see more familiarity with the Open Science Framework, where there is more attention to pre-registration of hypotheses and replication studies. In Medicine there is also some familiarity with a service like Scientific Protocols.

    • Other sharing methods mentioned: Evernote, OneNote, Google Keep, Media Wiki, Google Drive, SURFdrive, Apple iCloud, Dropbox, E-mail, Institutional shared folders, Basecamp, GitHub, E-Notebook, eLabjournal.com, ResearchGate, Mindly, Paper, Trello, Design paper, Netherlands Trial Register, project websites, Podio, Clinical trial.com (edit:clinicaltrials.gov)
  • For the analysis itself, many disciplines use their own specific tools. Excel is the common tool for at least 50 per cent of the Law and Arts & Humanities communities, and for even more of the other disciplines. IPython, R and MATLAB are used mostly by Physics and Engineering & Technology, and R is also known among Life scientists. SPSS is the commercial package used intensively in Medicine, Social sciences & Economics and the Life sciences. Still unknown, but interesting for the digital humanities, are DHBox and rOpenSci, both offering ready-to-go configurations of computational tools: the first a runtime environment in the cloud with R and IPython, the other an extensive software library for R. In the survey the following tools were mentioned by VU and VUmc researchers in the different disciplines. Some were mentioned frequently, like NVivo, or across disciplines, like Atlas.TI.

    • Physics: Fortran, Wolfram Mathematica, Linux, GAMS, ArcGIS, Origin, Gradeprofiler, Python, Java, C++, Comprehensive Meta-analysis (CMA), SAS, Mplus, Galaxy, PQ method, Atlas.TI, open office spreadsheet, Glotaran, R package TIMP
    • Engineering & Technology: Atlas.TI, OpenRefine, Python, Oxmetrics, Semantic web platforms, ACQknowledge, GraphPad, open office spreadsheet, Java, Glotaran, R package TIMP
    • Medicine: MaxQDA, Atlas.TI, ReviewManager, GraphPad Prism, SAS, Mplus, Apple Numbers, Stata, Comprehensive Meta-analysis (CMA), StatView, Review Manager (systematic reviews), FSL, Flowjo, MS Access, Snapgene, Accurri Analysis, instrument specific software, MLWIN, MS Word, Mindmeister, clonemanager, softmax, Vinci, Galaxy, Python, Java, C++, ACQknowledge, openMx, Plink, Wolfram Mathematica, PQ method, LibreOffice, picture analysis, Statistica, Spike/Signal
    • Social Science & Economics: Atlas.TI, Mplus, STATA, QSR Nvivo, MaxQDA, C++, SAS, Review Manager (systematic reviews), Comprehensive Meta-analysis (CMA), Python, MS Word, Transana, Mlwin, Oxmetrics, QGIS, Amos, Lyx, winedt, JASP, Wolfram Mathematica, Gephi, UCINET, NodeXL, ORA, ConText, Netdraw (social network analysis software), AmCat (Amsterdam Content Analysis Toolkit), Python package Pandas, ArcGIS, GAMS, Mindmeister, SmartPLS, Dedoose, Lingo software, maxima, EQS, PQ method, MS Access, fs/QCA software, Lisrel.
    • Law: Atlas.TI, MS Word, various text mining tools
    • Arts & Humanities: MS Access, MS Word, MPlus, Atlas.TI, Python, MaxQDA, concordance software e.g. AntConc, AmCat (Amsterdam Content Analysis Toolkit)

9.3 Writing

  • MS Word is the favorite office tool for writing, but Engineering & Technology use Google Docs and Overleaf more often for collaborative writing than other disciplines.
  • For managing references, Endnote is the most popular in all disciplines except Engineering & Technology, which prefers Mendeley. Endnote and Mendeley have a similar user base, except that Mendeley is used vastly more by younger researchers. Something to look into in the future, but we can imagine that many PhD candidates want something that works right after a download, instead of getting a license token from the university administration. While Mendeley is used less by Arts & Humanities, this group does use the open source reference manager Zotero much more than the other disciplines.

9.4 Publication

  • Although all disciplines publish in traditional journals, it is researchers in Medicine and the Life sciences who publish in OA topical and mega-journals.
  • When deciding which journal to submit an article to, the Journal Citation Reports ranking is still the leading indicator for most disciplines, except for researchers in Law, who seem to lead in the use of journal assessment platforms with an open access focus, like the Directory of Open Access Journals (DOAJ), Quality Open Access Market (QOAM), Sherpa/Romeo and Journalysis. Also mentioned were Eigenfactor.org, advice from supervisors and peers, the metrics from Google Scholar, and the similarity and reputation of authors in reference lists.
  • Most disciplines recognize ResearchGate as a place for archiving and sharing publications. Archiving scientific output to safeguard the corpus for future generations is not yet common practice in all disciplines, but Physics and Engineering & Technology have used arXiv for years to publish preprints, both to speed up the scientific process and to claim their findings at that particular date. Other expected patterns are visible: the Life sciences and Medicine use PubMed Central, and the institutional repository is known across all disciplines. Strangely, SSRN is used by Law a lot more than by the Social sciences, where we would expect it. We expected bioRxiv to be familiar among the Life sciences, but that service started only a few years ago.
  • Although plenty of platforms are available to share and archive data, code and presentations, only GitHub and Bitbucket see substantial use, and mainly by one discipline: Engineering & Technology. Other suggestions given for sharing code, data and presentations were: Open Science Framework (OSF), Dropbox, Onedrive, SURFdrive, SURFsara.nl, institutional shared folders, SPSS, Survey Monkey, external hard drives, e-mail, EDUgroepen.nl, Openclinica, Mendeley Data, SVN, GEO, Gitlab, EGA, tranSMART, B2SHARE, own website, supplement of papers, Academia.edu, dedicated repositories, WeTransfer. For archiving data in a discipline, the best place to start is the Registry of Research Data Repositories at RE3data.org.

9.5 Outreach

  • We see that Engineering & Technology use Google Scholar Citations for researcher profiles more than other disciplines. This might be related to the recent obligation from the Faculty of Sciences to use this channel as the official outlet for academic work. Again ResearchGate pops up as a platform broadly used in all disciplines to display one's work. Only where we see two dips at ResearchGate, for Arts & Humanities and Law, do these disciplines re-appear as spikes at Academia.edu. We also see that these disciplines are less familiar with ORCiD than the others. The profiles at our own institution are familiar, but we hear complaints about the lack of control and speed researchers have in influencing these pages. That is why respondents also mentioned their own websites a number of times.
  • Twitter is the most popular outlet for mentioning scientific findings to the public in Arts & Humanities and Social sciences & Economics, but Engineering & Technology also like to use it to show off their achievements. The same groups like to inform the public by placing information about their work on websites or blogging platforms like Wordpress. In smaller numbers, but fairly evenly distributed across all disciplines, researchers improve Wikipedia with their findings. The startup GrowKudos.com, which manages the distribution and measures the impact of your work on blogs and social media networks (Twitter, Facebook, LinkedIn), is still largely unknown. Other public outlets mentioned were newspapers, Facebook and LinkedIn.
  • Archiving posters and presentations is not common practice; only Engineering & Technology use Slideshare, Figshare and Vimeo. Also mentioned were Prezi, Dropbox, Youtube, own websites, ResearchGate and Academia.edu.

9.6 Assessment

  • Using services for peer review organized outside of journals is largely unknown territory. The common practice is to go with the review process organised by a journal. Decoupling this process makes it possible to validate the research and maintain trust in scientific findings, while publishing and spreading the work across multiple platforms for greater reach. Another method mentioned was discussing with peers.
  • To measure the impact of one’s output, the Life sciences, Medicine, Physics and Social sciences & Economics look at the Journal Citation Reports (JCR) and Web of Science as a reference. Other mentions were the Google Scholar citation index and, in Social sciences & Economics, the Eigenfactor. InspireBeta.net in high-energy physics is also a good example of profile metrics: pre-print/post-print ratio, citation breakdown in clusters, filtering on self-citations, publication type, co-authors, keyword frequencies, a publication timeline graph, etc. And the ERIM journal list (EJL) from Erasmus University is a good example of impact assessment scaled to a discipline-specific area with its own additional criteria.

10 Detailed overview for each discipline

In this overview we show the graphs focused on each discipline in each research phase. Next to these bars we place additional bars with which you can compare the discipline against the OECD average for that discipline and against the VU & VUmc average.

10.1 Arts & Humanities (N = 55)

10.1.1 Discovery

10.1.2 Analysis

10.1.3 Writing

10.1.4 Publication

10.1.5 Outreach

10.1.6 Assessment

10.2 Engineering & Technology (N = 35)

10.2.1 Discovery

10.2.2 Analysis

10.2.3 Writing

10.2.4 Publication

10.2.5 Outreach

10.2.6 Assessment

10.3 Law (N = 26)

10.3.1 Discovery

10.3.2 Analysis

10.3.3 Writing

10.3.4 Publication

10.3.5 Outreach

10.3.6 Assessment

10.4 Life Sciences (N = 144)

10.4.1 Discovery

10.4.2 Analysis

10.4.3 Writing

10.4.4 Publication

10.4.5 Outreach

10.4.6 Assessment

10.5 Medicine (N = 181)

10.5.1 Discovery

10.5.2 Analysis

10.5.3 Writing

10.5.4 Publication

10.5.5 Outreach

10.5.6 Assessment

10.6 Physical Sciences (N = 39)

10.6.1 Discovery

10.6.2 Analysis

10.6.3 Writing

10.6.4 Publication

10.6.5 Outreach

10.6.6 Assessment

10.7 Social Sciences & Economics (N = 176)

10.7.1 Discovery

10.7.2 Analysis

10.7.3 Writing

10.7.4 Publication

10.7.5 Outreach

10.7.6 Assessment
